Online Linear Optimization with Sparsity
Abstract
Now, let us consider the case of K = K_b with b ∈ (1, ∞). For v ∈ R^d and Q ⊆ [d], let v_Q denote the projection of v onto the dimensions in Q. Then for any v ∈ R^d and any w ∈ K_b with Q = {i : w_i ≠ 0}, we know by Hölder's inequality that 〈w, v〉 = 〈w_Q, v_Q〉 ≥ −‖w‖_b · ‖v_Q‖_a, for a = b/(b − 1). Moreover, one can have 〈w_Q, v_Q〉 = −‖w‖_b · ‖v_Q‖_a when |w_i|^b / ‖w‖_b^b = |v_i|^a / ‖v_Q‖_a^a and w_i v_i ≤ 0 for every i ∈ Q. Thus, to find w ∈ K_b which minimizes 〈w, v〉, we first let Q be a set of k coordinates with the largest |v_i| (without loss of generality, Q = [k]), so that ‖v_Q‖_a ≥ ‖v_Q′‖_a for any Q′ ⊆ [d] with |Q′| ≤ k. Then we let ŵ_i = −sign(v_i) |v_i|^{a−1} 1_{i ∈ Q} and choose w_i = ŵ_i / ‖ŵ‖_b. Clearly, we have w ∈ K_b and 〈w, v〉 = −‖v_Q‖_a ≤ −‖v_Q′‖_a ≤ 〈w′, v〉 for any w′ ∈ K_b with Q′ = {i : w′_i ≠ 0}, since |Q′| ≤ k.
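To make the construction above concrete, the following is a minimal Python sketch of the minimizer, under the assumption that K_b denotes the set of k-sparse vectors w ∈ R^d with ‖w‖_b ≤ 1; the function name and the small usage example are illustrative and not taken from the paper.

import numpy as np

def minimize_linear_over_sparse_lb_ball(v, k, b):
    # Sketch: argmin of <w, v> over k-sparse w with ||w||_b <= 1 (assumed form of K_b).
    a = b / (b - 1.0)                                      # dual exponent, 1/a + 1/b = 1
    Q = np.argsort(-np.abs(v))[:k]                         # k coordinates with largest |v_i|
    w_hat = np.zeros_like(v, dtype=float)
    w_hat[Q] = -np.sign(v[Q]) * np.abs(v[Q]) ** (a - 1.0)  # direction achieving Holder equality
    norm_b = np.linalg.norm(w_hat, ord=b)
    if norm_b == 0.0:                                      # v_Q = 0, so <w, v> = 0 for any feasible w
        return w_hat
    return w_hat / norm_b                                  # scale onto the l_b unit sphere

v = np.array([3.0, -1.0, 0.5, 2.0])
w = minimize_linear_over_sparse_lb_ball(v, k=2, b=1.5)     # here Q = {0, 3}

For b = 1.5 the dual exponent is a = 3, and the returned w attains 〈w, v〉 = −(3³ + 2³)^{1/3}, matching the value −‖v_Q‖_a derived above.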
Similar resources
Online Linear Optimization with Sparsity Constraints
We study the problem of online linear optimization with sparsity constraints in the semi-bandit setting. It can be seen as a marriage between two well-known problems: the online linear optimization problem and the combinatorial bandit problem. For this problem, we provide two algorithms which are efficient and achieve sublinear regret bounds. Moreover, we extend our results to two gener...
Control Problems with Equality Constraints
Abstract not available at time of publication. Dimitris Bertsimas, Massachusetts Institute of Technology, [email protected]. MS8, Less is More: Robustness and Sparsity in Multivariate Statistics. We describe applications of robust optimization in the context of machine learning, with a focus on classification and regression, principal component analysis, and fitting of graphical models. In these tasks, sparsity...
Online Portfolio Selection with Group Sparsity
In portfolio selection, it is often preferable to focus on a few top-performing industries or sectors to beat the market. These top-performing sectors, however, may change over time. In this paper, we propose an online portfolio selection algorithm that can take advantage of sector information through the use of a group-sparsity-inducing regularizer while making lazy updates to the portfolio...
Exploiting sparsity in linear and nonlinear matrix inequalities via positive semidefinite matrix completion
A basic framework for exploiting sparsity via positive semidefinite matrix completion is presented for an optimization problem with linear and nonlinear matrix inequalities. The sparsity, characterized by a chordal graph structure, can be detected in the variable matrix or in a linear or nonlinear matrix-inequality constraint of the problem. We classify the sparsity into two types, the...
Generalized Iterative Thresholding for Sparsity-Aware Online Volterra System Identification
The present paper explores the link between thresholding, one of the key enablers in sparsity-promoting algorithms, and Volterra system identification in the context of time-adaptive or online learning. A connection is established between the recently developed generalized thresholding operator and optimization theory via the concept of proximal mappings which are associated with non-convex pen...
Publication year: 2017